# Multi-source Data Fusion
## Llama 3 OffsetBias RM 8B
A reward model trained on the OffsetBias dataset, offering enhanced robustness against biases in evaluation models.

Maintainer: NCSOFT · Large Language Model · Transformers · English
## Roberta Tagalog Base
A pretrained language model for Tagalog, trained on multi-source data, aimed at improving performance on Tagalog natural language processing tasks.

Maintainer: GKLMIP · Large Language Model · Transformers